The Harvard Automated Processing Pipeline for Electroencephalography (HAPPE): Standardized Processing Software for Developmental and High-Artifact Data

Authors

  • Laurel J. Gabard-Durnam
  • Adriana S. Mendez Leal
  • Carol L. Wilkinson
  • April R. Levin
Abstract

Electroencephalography (EEG) recordings collected with developmental populations present particular challenges from a data processing perspective. These EEGs have a high degree of artifact contamination and often short recording lengths. As both sample sizes and EEG channel densities increase, traditional processing approaches like manual data rejection are becoming unsustainable. Moreover, such subjective approaches preclude standardized metrics of data quality, despite the heightened importance of such measures for EEGs with high rates of initial artifact contamination. There is presently a paucity of automated resources for processing these EEG data and no consistent reporting of data quality measures. To address these challenges, we propose the Harvard Automated Processing Pipeline for EEG (HAPPE) as a standardized, automated pipeline compatible with EEG recordings of variable lengths and artifact contamination levels, including high-artifact and short EEG recordings from young children or those with neurodevelopmental disorders. HAPPE processes event-related and resting-state EEG data from raw files through a series of filtering, artifact rejection, and re-referencing steps to processed EEG suitable for time-frequency-domain analyses. HAPPE also includes a post-processing report of data quality metrics to facilitate the evaluation and reporting of data quality in a standardized manner. Here, we describe each processing step in HAPPE, perform an example analysis with EEG files we have made freely available, and show that HAPPE outperforms seven alternative, widely-used processing approaches. HAPPE removes more artifact than all alternative approaches while simultaneously preserving greater or equivalent amounts of EEG signal in almost all instances.
We also provide distributions of HAPPE's data quality metrics in an 867 file dataset as a reference distribution and in support of HAPPE's performance across EEG data with variable artifact contamination and recording lengths. HAPPE software is freely available under the terms of the GNU General Public License at https://github.com/lcnhappe/happe.
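HAPPE itself is implemented in MATLAB on top of EEGLAB; the following is only a hypothetical Python/NumPy sketch of the generic kinds of steps the abstract names (band-pass filtering, automated rejection of artifact-laden channels, and average re-referencing), not HAPPE's actual algorithms. The function names, thresholds, and the variance-based channel-rejection heuristic are illustrative assumptions, far cruder than HAPPE's wavelet-thresholding and ICA-based artifact removal.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass(data, fs, lo=1.0, hi=100.0, order=4):
    # Zero-phase Butterworth band-pass applied along the time axis.
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, data, axis=-1)

def reject_bad_channels(data, z_thresh=10.0):
    # Crude stand-in for automated channel rejection: flag channels
    # whose signal variance is a robust (median/MAD) outlier.
    var = data.var(axis=1)
    med = np.median(var)
    mad = np.median(np.abs(var - med)) + 1e-12
    z = 0.6745 * (var - med) / mad
    return np.abs(z) < z_thresh  # boolean mask of channels to keep

def average_rereference(data):
    # Re-reference each time point to the mean across retained channels.
    return data - data.mean(axis=0, keepdims=True)

fs = 250.0
rng = np.random.default_rng(0)
eeg = rng.standard_normal((8, int(10 * fs)))  # 8 channels, 10 s
eeg[3] *= 50.0                                # one heavily contaminated channel

filt = bandpass(eeg, fs)
keep = reject_bad_channels(filt)
clean = average_rereference(filt[keep])
print(keep.sum(), clean.shape)
```

On this synthetic example the high-variance channel is dropped and the remaining channels are re-referenced to their common average; a real pipeline would additionally remove line noise, interpolate rejected channels, and segment the data before analysis.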


Similar Resources

Evaluation of the Psychometric Properties of Dunn's Sensory Profile School Companion

Objective: Sensory processing refers to the sequential reception, adjustment, and integration of sensory information, generating adaptive responses. People need appropriate sensory processing abilities to function adequately in their environment and to participate in activities of daily living. Dunn's sensory processing model consists of two constructions. The first constructio...


Fragmentation measurement using image processing

In this research, first of all, the existing problems in fragmentation measurement are reviewed for the sake of its fast and reliable evaluation. Then, the available methods used for evaluation of blast results are mentioned. The produced errors especially in recognizing the rock fragments in computer-aided methods, and also, the importance of determination of their sizes in the image analysis ...


The PREP pipeline: standardized preprocessing for large-scale EEG analysis

The technology to collect brain imaging and physiological measures has become portable and ubiquitous, opening the possibility of large-scale analysis of real-world human imaging. By its nature, such data is large and complex, making automated processing essential. This paper shows how lack of attention to the very early stages of an EEG preprocessing pipeline can reduce the signal-to-noise rat...


An automated pipeline for constructing personalized virtual brains from multimodal neuroimaging data

Large amounts of multimodal neuroimaging data are acquired every year worldwide. In order to extract high-dimensional information for computational neuroscience applications standardized data fusion and efficient reduction into integrative data structures are required. Such self-consistent multimodal data sets can be used for computational brain modeling to constrain models with individual meas...


EEGVIS: A MATLAB Toolbox for Browsing, Exploring, and Viewing Large Datasets

Recent advances in data monitoring and sensor technology have accelerated the acquisition of very large data sets. Streaming data sets from instrumentation such as multi-channel EEG recording usually must undergo substantial pre-processing and artifact removal. Even when using automated procedures, most scientists engage in laborious manual examination and processing to assure high quality data...



Journal:

Volume 12, Issue -

Pages -

Publication date: 2018